Canonical dependency analysis based on squared-loss mutual information
Authors

Abstract
Canonical correlation analysis (CCA) is a classical dimensionality reduction technique for two sets of variables that iteratively finds projection directions with maximum correlation. Although CCA remains widely used in many practical application areas, recent real-world data often contain more complicated nonlinear correlations that classical CCA cannot properly capture. In this paper...
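As a concrete illustration of the classical procedure the abstract refers to, here is a minimal NumPy sketch that recovers the first canonical pair via the standard generalized eigenproblem. The toy two-view data, the function name `cca_first_direction`, and the small ridge regularizer are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy "views" sharing one latent signal z (hypothetical data).
n = 500
z = rng.normal(size=n)
X = np.c_[z + 0.1 * rng.normal(size=n), rng.normal(size=n)]
Y = np.c_[rng.normal(size=n), -z + 0.1 * rng.normal(size=n)]

def cca_first_direction(X, Y, reg=1e-6):
    """First canonical pair via the classical generalized eigenproblem."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])  # within-view covariances
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n                             # cross-covariance
    # Leading eigenvector of Cxx^{-1} Cxy Cyy^{-1} Cyx gives direction a.
    M = np.linalg.solve(Cxx, Cxy) @ np.linalg.solve(Cyy, Cxy.T)
    vals, vecs = np.linalg.eig(M)
    a = vecs[:, np.argmax(vals.real)].real
    b = np.linalg.solve(Cyy, Cxy.T @ a)           # paired direction for Y
    rho = abs(np.corrcoef(X @ a, Y @ b)[0, 1])    # canonical correlation
    return a, b, rho

a, b, rho = cca_first_direction(X, Y)
print(round(rho, 2))  # close to 1: the shared latent signal is recovered
```

On this toy data the maximum-correlation pair aligns with the columns carrying the shared latent `z`, which is exactly the behavior nonlinear dependencies break.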
Similar papers

Information-Maximization Clustering Based on Squared-Loss Mutual Information
Information-maximization clustering learns a probabilistic classifier in an unsupervised manner so that mutual information between feature vectors and cluster assignments is maximized. A notable advantage of this approach is that it involves only continuous optimization of model parameters, which is substantially simpler than discrete optimization of cluster assignments. However, existing metho...
Estimating Squared-Loss Mutual Information for Independent Component Analysis
Accurately evaluating statistical independence among random variables is a key component of Independent Component Analysis (ICA). In this paper, we employ a squared-loss variant of mutual information as an independence measure and give its estimation method. Our basic idea is to estimate the ratio of probability densities directly without going through density estimation, by which a hard task o...
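The core idea above (estimating the density ratio directly, without separate density estimation) can be sketched with a uLSIF-style least-squares fit: model the ratio p(x, y)/(p(x)p(y)) as a kernel expansion and solve a linear system. The toy data, kernel width, number of centers, and regularization strength below are all illustrative assumptions; shuffling y is a standard way to draw approximate samples from the product of marginals.

```python
import numpy as np

rng = np.random.default_rng(1)

# Paired samples with a nonlinear dependence y = x^2 + noise (toy data).
n = 300
x = rng.normal(size=n)
y = x**2 + 0.3 * rng.normal(size=n)
XY = np.c_[x, y]                        # samples from p(x, y)
XY_perm = np.c_[x, rng.permutation(y)]  # shuffled y approximates p(x)p(y)

def gauss_kernel(A, C, sigma):
    """Gaussian kernel matrix between sample rows A and center rows C."""
    d2 = ((A[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def smi_ulsif(joint, prod, n_centers=50, sigma=1.0, lam=1e-3):
    """SMI estimate from a least-squares density-ratio fit (uLSIF-style)."""
    C = joint[:n_centers]                # kernel centers from joint samples
    Kj = gauss_kernel(joint, C, sigma)   # basis evaluated on p(x, y) samples
    Kp = gauss_kernel(prod, C, sigma)    # basis evaluated on product samples
    H = Kp.T @ Kp / len(prod)            # second moment under the product
    h = Kj.mean(axis=0)                  # first moment under the joint
    alpha = np.linalg.solve(H + lam * np.eye(len(C)), h)
    # Plug-in SMI estimator: (1/2) E_{p(x,y)}[r_hat] - 1/2.
    return 0.5 * h @ alpha - 0.5

smi_dep = smi_ulsif(XY, XY_perm)
y_ind = rng.normal(size=n)               # independent control
smi_ind = smi_ulsif(np.c_[x, y_ind], np.c_[x, rng.permutation(y_ind)])
print(smi_dep > smi_ind)                 # dependence scores higher
```

The whole fit reduces to one regularized linear solve, which is the computational advantage the abstract alludes to.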
Machine Learning with Squared-Loss Mutual Information
Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was introduced. While ordinary MI is the Kullback–Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its P...
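The contrast stated above — ordinary MI as a Kullback–Leibler divergence, SMI as its Pearson (chi-squared) counterpart — is easy to check numerically for discrete variables, where both quantities have closed forms over the joint table. The helper names below are illustrative, not from the paper.

```python
import numpy as np

def mi_discrete(P):
    """Ordinary MI: KL divergence from P(x, y) to the product of marginals."""
    Q = P.sum(1, keepdims=True) @ P.sum(0, keepdims=True)
    mask = P > 0
    return np.sum(P[mask] * np.log(P[mask] / Q[mask]))

def smi_discrete(P):
    """Squared-loss MI: Pearson divergence from P(x, y) to the marginals' product."""
    Q = P.sum(1, keepdims=True) @ P.sum(0, keepdims=True)
    r = P / Q                          # density ratio p(x, y) / (p(x) p(y))
    return 0.5 * np.sum(Q * (r - 1.0) ** 2)

indep = np.outer([0.3, 0.7], [0.4, 0.6])   # independent joint distribution
dep = np.array([[0.4, 0.1],
                [0.1, 0.4]])               # dependent joint distribution
print(mi_discrete(indep), smi_discrete(indep))   # both vanish under independence
print(mi_discrete(dep), smi_discrete(dep))       # both positive under dependence
```

Both measures are zero exactly when the variables are independent; they differ only in how deviations of the density ratio from 1 are penalized (logarithmically versus quadratically).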
Computationally Efficient Sufficient Dimension Reduction via Squared-Loss Mutual Information
The purpose of sufficient dimension reduction (SDR) is to find a low-dimensional expression of input features that is sufficient for predicting output values. In this paper, we propose a novel distribution-free SDR method called sufficient component analysis (SCA), which is computationally more efficient than existing methods. In our method, a solution is computed by iteratively performing depe...
Journal

Journal title: Neural Networks
Year: 2012
ISSN: 0893-6080
DOI: 10.1016/j.neunet.2012.06.009